Kernel Implicit Variational Inference

Authors

  • Jiaxin Shi
  • Shengyang Sun
  • Jun Zhu
Abstract

Recent progress in variational inference has paid much attention to the flexibility of variational posteriors. One line of work uses implicit distributions, i.e., distributions without tractable likelihoods, as the variational posterior. However, existing methods for implicit posteriors still suffer from noisy estimation and can hardly scale to high-dimensional latent variable models. In this paper, we present an implicit variational inference approach with kernel density ratio fitting that addresses these challenges. To the best of our knowledge, this is the first time implicit variational inference has been successfully applied to Bayesian neural networks, with promising results on both regression and classification tasks.
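
To make the density-ratio idea concrete, below is a minimal, self-contained sketch of one standard kernel-based ratio estimator (a uLSIF-style least-squares fit with an RBF kernel), the kind of ingredient needed when the KL term of the ELBO involves an implicit posterior whose density cannot be evaluated. It is not the authors' exact estimator; the function names, bandwidth, and regularizer below are illustrative choices.

```python
# Hypothetical sketch: kernel density ratio estimation for an implicit q.
# Fits r(w) ~= q(w)/p(w) from samples only, so log q(w) never needs to be evaluated.
import numpy as np

def rbf_kernel(X, C, bandwidth):
    """k(x, c) = exp(-||x - c||^2 / (2 * bandwidth^2)) for all pairs."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def fit_ratio(q_samples, p_samples, bandwidth=1.0, reg=1e-3):
    """Model r(w) = sum_l alpha_l k(w, c_l) with q_samples as centers,
    fitted by a least-squares (uLSIF-like) objective."""
    C = q_samples
    K_p = rbf_kernel(p_samples, C, bandwidth)    # kernel features under p
    K_q = rbf_kernel(q_samples, C, bandwidth)    # kernel features under q
    H = K_p.T @ K_p / len(p_samples)             # approximates E_p[k k^T]
    h = K_q.mean(axis=0)                         # approximates E_q[k]
    alpha = np.linalg.solve(H + reg * np.eye(len(C)), h)
    return lambda W: np.maximum(rbf_kernel(W, C, bandwidth) @ alpha, 1e-8)

# Toy check: q = N(1, 1), p = N(0, 1); the analytic KL(q || p) is 0.5.
rng = np.random.default_rng(0)
q_w = rng.normal(1.0, 1.0, size=(500, 1))
p_w = rng.normal(0.0, 1.0, size=(500, 1))
ratio = fit_ratio(q_w, p_w)
print("estimated KL(q || p):", np.log(ratio(q_w)).mean())
```

In an implicit-VI setting, such a fitted log ratio would stand in for the intractable log q(w) - log p(w) term of the ELBO while the likelihood term is estimated by Monte Carlo as usual.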


Similar papers

Hierarchical Implicit Models and Likelihood-Free Variational Inference

Implicit probabilistic models are a flexible class of models defined by a simulation process for data. They form the basis for theories which encompass our understanding of the physical world. Despite this fundamental nature, the use of implicit models remains limited due to challenges in specifying complex latent structure in them, and in performing inferences in such models with large data se...
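
To illustrate what "defined by a simulation process for data" means, here is a hypothetical toy simulator; the generative steps and names are invented for illustration and are not from the paper.

```python
# Hypothetical toy "implicit model": we can easily simulate data given parameters,
# but the induced likelihood p(x | theta) has no tractable closed form.
import numpy as np

def simulate(theta, n, rng):
    z = rng.exponential(scale=theta, size=n)     # latent structure
    x = np.sin(z) + 0.1 * rng.normal(size=n)     # nonlinear simulator plus noise
    return x

rng = np.random.default_rng(1)
x_obs = simulate(theta=2.0, n=1000, rng=rng)     # sampling is easy...
# ...but evaluating p(x_obs | theta) exactly is intractable, hence "likelihood-free".
```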

Laplace Variational Iteration Method for Modified Fractional Derivatives with Non-singular Kernel

A universal approach via the Laplace transform to the variational iteration method for fractional derivatives with non-singular kernels is presented; in particular, the Caputo-Fabrizio fractional derivative and the Atangana-Baleanu fractional derivative with the non-singular kernel are considered. The analysis elaborated for both non-singular kernel derivatives shows the necessity of considering...
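
For context, the two derivatives named above are, in one commonly used form for order $0 < \alpha < 1$ (normalization conventions vary across the literature),

\[
{}^{\mathrm{CF}}D_t^{\alpha} f(t) = \frac{M(\alpha)}{1-\alpha} \int_{0}^{t} f'(\tau)\, \exp\!\left(-\frac{\alpha (t-\tau)}{1-\alpha}\right) d\tau,
\qquad
{}^{\mathrm{ABC}}D_t^{\alpha} f(t) = \frac{B(\alpha)}{1-\alpha} \int_{0}^{t} f'(\tau)\, E_{\alpha}\!\left(-\frac{\alpha (t-\tau)^{\alpha}}{1-\alpha}\right) d\tau,
\]

where $M(\alpha)$ and $B(\alpha)$ are normalization functions and $E_{\alpha}$ is the Mittag-Leffler function. In both cases the kernel stays bounded as $\tau \to t$, unlike the classical Caputo kernel $(t-\tau)^{-\alpha}$, which is what "non-singular kernel" refers to.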

Strong convergence of a general implicit algorithm for variational inequality problems and equilibrium problems and a continuous representation of nonexpansive mappings

We introduce a general implicit algorithm for finding a common element of the set of solutions of systems of equilibrium problems and the set of common fixed points of a sequence of nonexpansive mappings and a continuous representation of nonexpansive mappings. Then we prove the strong convergence of the proposed implicit scheme to the unique solution of the minimization problem on the so...
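
For reference, the standard problems referred to above take the following generic form (notation here is illustrative, not the paper's): the variational inequality problem seeks $x^{*} \in C$ with $\langle F(x^{*}),\, x - x^{*} \rangle \ge 0$ for all $x \in C$; the equilibrium problem seeks $x^{*} \in C$ with $f(x^{*}, x) \ge 0$ for all $x \in C$; and a mapping $T$ is nonexpansive when $\|Tx - Ty\| \le \|x - y\|$ for all $x, y$.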

Variational Inference using Implicit Distributions

Generative adversarial networks (GANs) have given us a great tool to fit implicit generative models to data. Implicit distributions are ones we can sample from easily, and take derivatives of samples with respect to model parameters. These models are highly expressive and we argue they can prove just as useful for variational inference (VI) as they are for generative modelling. Several papers h...
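
As a concrete illustration of that property (easy sampling plus differentiable samples), here is a hypothetical few-line sketch of an implicit distribution realized as a neural sampler; the architecture and names are made up for illustration only.

```python
# Hypothetical implicit distribution: push Gaussian noise through a network.
# Sampling is trivial and samples are differentiable w.r.t. the sampler's
# parameters, but the density of the output has no tractable closed form.
import torch

sampler = torch.nn.Sequential(
    torch.nn.Linear(8, 32), torch.nn.ReLU(), torch.nn.Linear(32, 2)
)

eps = torch.randn(64, 8)     # base noise
z = sampler(eps)             # samples from the implicit distribution q(z)
loss = (z ** 2).mean()       # any differentiable function of the samples
loss.backward()              # gradients flow back to the sampler's parameters
print(sampler[0].weight.grad.shape)
```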

Stochastic Variational Deep Kernel Learning

Deep kernel learning combines the non-parametric flexibility of kernel methods with the inductive biases of deep learning architectures. We propose a novel deep kernel learning model and stochastic variational inference procedure which generalizes deep kernel learning approaches to enable classification, multi-task learning, additive covariance structures, and stochastic gradient training. Spec...
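
For intuition, a minimal sketch of the deep-kernel idea follows: a standard base kernel (here an RBF) applied to the outputs of a neural feature extractor, so that k(x, x') = k_rbf(g(x), g(x')). This is not the paper's full model; the feature network and lengthscale are illustrative.

```python
# Hypothetical deep kernel: base RBF kernel on learned features g(x).
import torch

g = torch.nn.Sequential(
    torch.nn.Linear(10, 64), torch.nn.Tanh(), torch.nn.Linear(64, 2)
)

def deep_rbf_kernel(X1, X2, lengthscale=1.0):
    H1, H2 = g(X1), g(X2)              # learned features
    d2 = torch.cdist(H1, H2) ** 2      # pairwise squared distances in feature space
    return torch.exp(-d2 / (2.0 * lengthscale ** 2))

X = torch.randn(5, 10)
K = deep_rbf_kernel(X, X)              # 5 x 5 covariance matrix
print(K.shape)
```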

Publication date: 2017